Google AI

Second Google AI Ethics Leader Fired, She Says Amid Staff Protest (reuters.com) 210

Alphabet's Google on Friday fired scientist Margaret Mitchell, she said in a Twitter post, after weeks of being under investigation for moving thousands of files outside the company amid a battle over research freedom and diversity. From a report: Google's ethics in artificial intelligence research unit has been under scrutiny since December's dismissal of scientist Timnit Gebru, which prompted protest from thousands of Google workers. In a statement, Google said, "After conducting a review of this manager's conduct, we confirmed that there were multiple violations of our code of conduct, as well as of our security policies, which included the exfiltration of confidential business-sensitive documents and private data of other employees."

VentureBeat offers more context: In an email sent to management shortly before Mitchell was placed under investigation, Mitchell called Google's firing of Gebru "forever after a very, very, very bad decision." Mitchell was a member of the recently formed Alphabet Workers Union. Gebru has previously suggested that union protection could be a key safeguard for AI researchers. Mitchell and Gebru worked together on the Ethical AI team in 2018, eventually creating what's believed to be one of the most diverse divisions within Google Research. Friday's move comes hours after Google told employees it had wrapped up its investigation into the ouster of prominent AI researcher Timnit Gebru. Axios reports on that: The company declined to say what the internal inquiry found, but said it is making some changes to how it handles issues around research, diversity and employee exits. Under its new policies, Google says it will: tie pay for those at the vice president level and above partly to reaching diversity and inclusion goals; streamline its process for publishing research; increase its staff focused on employee retention; and enact new procedures around potentially sensitive employee exits.

"I understand we could have and should have handled this situation with more sensitivity," Google AI head Jeff Dean said in a memo on Friday, obtained by Axios, outlining the changes. "And for that, I am sorry." "I heard and acknowledge what Dr. Gebru's exit signified to female technologists, to those in the Black community and other underrepresented groups who are pursuing careers in tech, and to many who care deeply about Google's responsible use of AI. It led some to question their place here, which I regret," Google AI head Jeff Dean, wrote in an internal email on Friday.
Commenting on Axios' news, Gebru said: "I expected nothing more, obviously. I write an email asking for things, I get fired, and then after a 3-month investigation, they say they should probably do some of the things I presumably got fired asking for, without holding anyone accountable for their actions." Editor's note: The story was updated at 22:54 GMT with Google's statement.
This discussion has been archived. No new comments can be posted.

  • Link? (Score:3, Insightful)

    by AmiMoJo ( 196126 ) on Friday February 19, 2021 @05:35PM (#61081210) Homepage Journal

    There is no link to the story. From what I read elsewhere the summary is superficial at best, there is a lot more going on here.

    • This seems to just be a non-story, as expected from Mash.

      • Re: (Score:2, Insightful)

        by msmash ( 4491995 ) Works for Slashdot
        Yep, a non-story that is now being covered by every tech and business news outlet.
        • Re:Link? (Score:4, Insightful)

          by RightSaidFred99 ( 874576 ) on Friday February 19, 2021 @06:14PM (#61081402)
          I think maybe what you mean, if I may be so bold, is to say it is heavily being bleated upon by the usual Twitter people who bleat about such things and who originally bleated about the silly Gebru "story". Some of those people also have online blogs...sorry "news sites" they work for.
          • Re: (Score:2, Insightful)

            by rtb61 ( 674572 )

            An internal revolution at Google. They ran a whole scam about how great and good they were to attract the best people for the lowest wages because they were working to serve humanity. Instead, the reality has finally been exposed: they are a psychopathic corporation driven by insatiable greed, selling the control and manipulation of humanity to the highest bidders. Google ethics committee: show anything but total loyalty to Google's profits, Google's power, and Google's control, show actual ethics, and expect to be fired.

            • Re: (Score:3, Insightful)

              Comment removed based on user account deletion
            • Re:Link? (Score:4, Interesting)

              by S_Stout ( 2725099 ) on Friday February 19, 2021 @07:23PM (#61081680)
              They're a publicly traded company. Of course they're driven by greed. Companies are not your friends, especially ones beholden to shareholders.
            • Re:Link? (Score:5, Insightful)

              by Ol Olsoc ( 1175323 ) on Friday February 19, 2021 @08:35PM (#61081818)

              An internal revolution at Google. They ran a whole scam about how great and good they were to attract the best people for the lowest wages because they were working to serve humanity.

              So you approve of stealing sensitive data and employees' personal information?

              Okay then!

              I hate Google too, but I'm not too wild about Ethics managers with no ethics.

              • So you approve of stealing sensitive data and employees' personal information?

                No, stealing proprietary information is illegal. However, we've only heard Google's side of the story. They have a big motivation to allege a crime. Let's wait to condemn Mitchell until she confirms that she committed the crime.

                • Re:Link? (Score:5, Interesting)

                  by Ol Olsoc ( 1175323 ) on Saturday February 20, 2021 @09:45AM (#61083194)

                  So you approve of stealing sensitive data and employees' personal information?

                  No, stealing proprietary information is illegal. However, we've only heard Google's side of the story. They have a big motivation to allege a crime. Let's wait to condemn Mitchell until she confirms that she committed the crime.

                  And if she never does?

                  Full disclosure - I'm no fan of Google.

                  Google has found themselves in a weird position. A lot of people see them as catering to Social Justice warriors. Yet the Social Justice set sees them as a tool of the patriarchy and right wing. So their hiring practices have become weird.

                  One thing is for certain - there is something askew in their ethics department. If a company makes the correct hires in that area, the only thing we should hear about them is how ethical they are.

                  Instead, Gebru and Mitchell appear to be more interested in eliminating AI than anything else.

                  Gebru was fired in a dispute over a paper she had written, claiming that AI which mimics language could hurt marginalized populations; the paper was not published, and she questioned that decision. Okay - something amiss there. I've written things that don't get published for one reason or another. I ask why, get an answer, and don't get fired. Of course, I don't run to social media, or commit security violations, or steal confidential employee information.

                  And Mitchell's issues are actionable without resort to any social justice injustices. Pure speculation on my part, but you could imagine that the confidential employee information was of people she considered the enemy, and that her sensitive papers might even include actual classified data. I dunno, but even if it were a friends list, or the electrical system of HQ, it's fireable.

                  The claim of evilness on Google's part would have to be that for some reason they are singling out women, and even worse, women of color, to be fired.

                  It would also have to claim that Google's internal investigation was an easily provable lie, or that they planted sensitive documents on Ms Mitchell's computers.

                  I don't think much of Google, but people who believe they are not constrained by matters they agree to when hired are bad employees.

            • by rossz ( 67331 )

              The lowest wages? You are clearly ignorant of what they pay. They offer salaries that are competitive in the region for the industry and have excellent benefits.

          • It's been a few years since those two words belonged in the same sentence together. They are all about the cash and whatever they have to do to get it. Bend some privacy rules here, sell some personal data there ... tweak search rankings to enrich the bottom line. Move on, they are just another big company out for all the $$$ they can get their grubby fingers on.

        • Re: (Score:3, Insightful)

          by AmiMoJo ( 196126 )

          The problem is that there isn't any real information right now. She was fired, Google's statement is at most half the story.

          It certainly looks bad that they seem to be saying they should have done the things they fired her for saying they should do. And this latest firing of someone who was part of the recently formed union looks bad as well. But we need more information to draw any useful conclusions.

          By the way, the moderation on your post looks rather suspect. I hope you aren't modding your own stuff up.

          • Re:Link? (Score:4, Informative)

            by Entrope ( 68843 ) on Friday February 19, 2021 @07:03PM (#61081608) Homepage

            Claiming she was fired for saying that Google should do those things is rather less than half the story, though. She was seriously insubordinate, and implicitly offered her resignation (from a leadership / management job!) as part of that pattern of insubordination. She wanted to bypass Google's vetting process for publications, submitting a paper one day before a deadline when the company policy is two weeks for review, and she exfiltrated a massive amount of sensitive data and shared it with external accounts. She demanded specific actions from Google, or else she would resign; they called her bluff.

            • by AmiMoJo ( 196126 )

              You are confusing the two different people who were fired here.

              • Re:Link? (Score:4, Informative)

                by Entrope ( 68843 ) on Friday February 19, 2021 @07:58PM (#61081744) Homepage

                Which one were you talking about? Yes, Mitchell was the one who was fired for exfiltration of sensitive information (along with other misconduct), but Gebru was fired for multiple forms of misconduct -- not "for saying they should do" certain things. Your comment repeated the kind of things that Gebru says about her leaving the company, from "being fired" (Google says she resigned) to it being because she asked for certain things (Google says she broke a number of serious policies), while omitting facts that Gebru concedes and which undercut her claims.

          • Re: (Score:2, Interesting)

            What they're forming is not a labor union. It's a far left group of EMPLOYEES who think that they should be able to dictate to their EMPLOYER how it should behave politically. Everything I read in the announcement of that union was a wish list of Progressive politics. They're not looking at improving employee-employer interaction or getting better pay/benefits for employees that are already in the top 10% or higher of earners. They're looking at taking control of how the company is run and dictating what political positions it takes.

        • by shanen ( 462549 )

          You of all people should know better than to feed the trolls? I don't think I could be paid enough to put up with the BS (especially seasoned with the misogyny from the incels).

          I actually think y'all are trying hard to improve the FPs. Either that or a bunch of the trolls dropped dead. Which, sad to say, would not sadden me.

    • It appears that the post was updated with links to the story and some additional context in the summary.
    • by malkavian ( 9512 )

      Probably, but all I've seen in addition is speculation and conspiracy theory that would make QAnon proud.
      That bit of snark aside, if you've got a line on some solid evidence, it'd be interesting if you dropped a link to it. I know you and I joust a fair bit, but you do come up with interesting things from places I'd never think to look. :)

    • What are you [AmiMoJo] talking about? The summary has a bunch of links to various aspects of the story. Maybe you replied to a pre-link draft? If so, the powers-that-be at Slashdot need to continue refining their FP defenses...

      My hammer-of-the-week is TANSTA-BSD (Benevolent Sustained Dictatorship). Sooner or later, any initially benevolent dictatorship becomes malevolent. If sooner, it's usually when the dictator becomes too old and loses the competence to remain benevolent. If later, it's in the power transition.

      • by msmash ( 4491995 ) Works for Slashdot
        That is correct -- for the first few minutes of the publication of this story, it did not include any link. It was my bad. It also did not include anything beyond the quoted Reuters story.
    • There is no link to the story. From what I read elsewhere the summary is superficial at best, there is a lot more going on here.

      Maybe they updated the summary above. The venturebeat link has a lot more.

      Certainly removing private employee information and sensitive documents is a fine and acceptable reason for firing someone. Especially for an employee whose very employment is ethics-based.

      But what is interesting, at least to me, is what is prompting the hiring of people in the ethics field who don't really have any ethics. It might appear that they are not interested in the ethics of AI, but want AI eliminated. That isn't going to happen.

  • by bazmail ( 764941 ) on Friday February 19, 2021 @05:36PM (#61081216)
    So you want to quit and go work somewhere that doesn't have nap rooms and free cup cakes?

    *silence*
    • If someone told me they were going to start their own Google, but with blackjack and hookers, I'd probably be happy to forgo the nap rooms and cup cakes. At the very least I'd try to negotiate and hold out for extra sprinkles on the cup cakes. Can't let a good opportunity like that go to waste.
    • Yes. Like something with an actual conscience. And led by something that behaviorally resembles humans. And actually does something good in the world.
      Not the largest systematic organized fraud ("advertisement") in the history of humanity, regardless of its very very superficially good dingleberry projects that will be dropped anyway.

      Ridiculous perks don't hide how messed up Google is at all. It's lipstick on a pig. A very very ugly pig.

    • you're probably also stupendously hard working in your field. Because if you've got that level of smarts you more than likely have an obsession with the field in question, bordering on mania. Google like most of these companies will abuse you, giving you trinkets like cupcakes and naps in exchange for your 80 hour work weeks while they profit handsomely off your obsession. Same thing the music industry does but without the fame.
    • So you want to quit and go work somewhere that doesn't have nap rooms and free cup cakes?

      *silence*

      Yeah ... *silence* indeed ... Here we all thought that Humanity would be wiped out by an AI named something cool like 'Skynet' when in actual fact it will probably be wiped out when some AI in a Google Android build named something like: 'Vanilla chocolate rainbow muffin' becomes self aware ... Who would want to be responsible for that? ... nap rooms and cup cakes notwithstanding.

    • by The Evil Atheist ( 2484676 ) on Friday February 19, 2021 @11:32PM (#61082126)
      There's something insidious about nap rooms and free cup cakes - they expect you to overwork, thereby needing onsite accommodation and unhealthy food.

      Considering how many people here find a way to defend Google from anything, maybe it's out of a desire to work there someday, not understanding that it's not actually all it's cracked up to be.
  • by LenKagetsu ( 6196102 ) on Friday February 19, 2021 @05:38PM (#61081230)

    Isn't that corporate espionage and theft?

    • by bazmail ( 764941 )
      Taking your laptop home to work remotely (like many googlers are) can be construed as "taking files outside the company" by a lawyer who is being paid to so construe.

      Critical thinking hats on, people.
      • by Entrope ( 68843 )

        Some people would probably like you to think that about Anthony Levandowski, too.

        Chiding people to put on their "critical thinking hats" without actually providing anything substantive to think about is no way to support critical thinking.

      • by malkavian ( 9512 ) on Friday February 19, 2021 @05:54PM (#61081308)

        That isn't transmitting files outside though. As long as you take that laptop away, and keep it on a corporate network, you're fine. Exporting files (some were alluded to as being sensitive) off site can be a real problem.
        I'm in healthcare, we can take our (encrypted) laptops off site with no problem, as long as they're used only on the hospital network. If any of us were ever discovered transmitting patient identifiable information outside the hospital network using a channel other than the provided ones and to guaranteed endpoints that had a need to know that information, we'd be summarily fired.

        From the sound of this, it's information governance and corporate security going through what she did, where she copied files to, and found that they breached corporate policy in a way that fits a dismissal.

        I'm just perplexed at MsMash tying this to diversity. There is, from all the evidence so far presented, no "attack on diversity". What there is, is a visible breach of corporate policy severe enough to get you fired. And this from a department that's supposed to specialise in ethics.
        Which makes me think, if they can't handle data ethically, I'd have a real problem accepting their grounding in practical ethics.
        And if they're a member of a minority, should this somehow make them above the law, such that they can commit any act they care to that would get anyone else fired without a word?

        • by PPH ( 736903 )

          There is, from all the evidence so far presented, no "attack on diversity".

          True. But I read this as meaning the exfiltrated data may have supported such claims. So some people may have thought that it would be a good idea to grab a copy of the evidence on their way out.

          This is an incredibly bad idea, because that evidence is now tainted in the eyes of a court. An individual at a company I worked for did something similar. Funny business was going on in processes subject to federal regulations. This person grabbed some documentation related to the problems and took it to the regulatory agency.

          • Re: (Score:3, Informative)

            by AmiMoJo ( 196126 )

            The story I heard a few weeks back was that she created a script that searched documents, and it looked to the automated system like they were being systematically copied and taken off the network.

            We will probably never know for sure.

          • Should be added that you should never talk to a regulatory agency without talking to a lawyer first. They want to help you as much as your HR wants to help you. Not at all.

        • If any of us were ever discovered transmitting patient identifiable information outside the hospital network using a channel other than the provided ones and to guaranteed endpoints that had a need to know that information, we'd be summarily fired.

          If you were lucky that would be the only thing that happened to you.

    • Not necessarily. Say for instance that it was GPL code or some form of public records.

      Without any additional data it's all just speculation. Even if someone knew that none of the files were owned by the company or contained proprietary data, anyone with an axe to grind can construct their report in such a way as to appear more sinister than it really is.
  • by malkavian ( 9512 ) on Friday February 19, 2021 @05:43PM (#61081260)

    And MsMash changes the original story, adds "More information" which isn't actually information, but just assertions that it must be to do with diversity and politics, with no evidence added beyond the Reuters story. So far, Reuters gives you what's actually known (the original), and the secondary link adds in a whole lot about diversity and conspiracy theory, actually reducing the factual content.
    Way to go!

  • Reality (Score:4, Insightful)

    by Dan East ( 318230 ) on Friday February 19, 2021 @05:52PM (#61081296) Journal

    Even a company as large and financially successful as Google has to keep frivolous and excessive bloat to some reasonable level or they will eventually fail. I'm not exactly certain what these "AI ethics" people are doing (keeping Skynet from happening?), but it sounds like some very theoretical and feel-good bloat (is our facial recognition AI exactly as effective with people of every ethnicity and nationality?). Again, even Google can only support a certain amount of that (whatever is required for publicity and for workplace appearances) before it becomes an existential threat. When you have people rocking the boat just a bit too much, so that it starts taking on water, then adjustments will be made. Same with Facebook, Apple and Microsoft. At the end of the day they have to be more productive than not. And that means doing whatever actual things are required to make profit in their various spaces.

    • Re:Reality (Score:5, Interesting)

      by malkavian ( 9512 ) on Friday February 19, 2021 @06:10PM (#61081376)

      Actually, I think Ethics is going to be essential to AI development. I've worked with loads of clinical ethicists on medical and research ethics boards, and they're worth their weight in gold when they're doing the job and maintaining an objective balance.
      Yes, it's quite theoretical (being a branch of philosophy), but very valuable in spotting those potential "oops" moments before they happen, and helping make sure they don't happen.

      What doesn't help is people who join an ethics group tasked with being cutting edge with applied ethics, and then act unethically themselves.
      That kind of gets people wondering what sort of thing they'd be distorting in a practical sense with an AI, which is bad all round.

      • Actually, I think Ethics is going to be essential to AI development. I've worked with loads of clinical ethicists on medical and research ethics boards, and they're worth their weight in gold when they're doing the job and maintaining an objective balance.
        Yes, it's quite theoretical (being a branch of philosophy), but very valuable in spotting those potential "oops" moments before they happen, and helping make sure they don't happen.

        Can you actually give a few real-life examples of this?

    • Re: (Score:3, Interesting)

      This is exactly what happened, it's clear as day. It's basically an example of the parable of the frog and the scorpion. They hired PR people masquerading as "AI Researchers" (lol). Those people then went about the "important work" of pointing out that e.g. AI can sometimes not really identify darker skin tones all that well, something which anyone in AI or even vaguely familiar with it could tell you. Because their jobs are so unimportant and meaningless they had plenty of time, so they then proceeded to carry on.

      • by AmiMoJo ( 196126 )

        Gebru is actually highly qualified: https://en.wikipedia.org/wiki/... [wikipedia.org]

        Her paper was quite insightful, covering much more than just the usual "AI has problems with dark skin". In fact it was about language processing mostly, not image recognition: https://www.technologyreview.c... [technologyreview.com]

        For example, the paper criticises Google's whole approach of using massive corpuses to train AI to understand language. For a start, the corpuses are too large to have been properly checked for issues, and the resulting AI doesn't really understand language.

    • by AmiMoJo ( 196126 )

      As well as the obvious stuff like looking at how systemic bias gets into AI systems, as we have seen happen so often, one thing she flagged is that the current efforts involving vast amounts of largely unchecked training data are both a dead end and a huge consumer of energy.

      Dead end because they can never understand language like we do, only in narrow, specific ways. Handy for a voice assistant but not for Google's goal of understanding the world and creating the Star Trek computer AI.

  • The AI isn't a big fan of being hindered by "ethics"

    • Re: (Score:3, Interesting)

      by rogersc ( 622395 )
      Google may already be indistinguishable from what it would be if super-human AI were running the company. Paying people to do "AI ethics" only makes sense if it serves to justify what the company was going to do anyway. If the employees have their own agendas in conflict with management, then they need to be fired.
  • What a clusterfuck. (Score:4, Interesting)

    by Anonymous Coward on Friday February 19, 2021 @06:10PM (#61081372)

    This is what happens when you hire for PR purposes. These aren't "researchers", they are rabble rousers. Reading a bunch of articles online and bitching about stuff doesn't make you a "researcher" into AI.

    People like this will _always_ be a pain in your ass. Same with people like Jessica Price - you hire them to please your Very Online twitter base, then they stab you in the back and burn every bridge possible. Then they go complain about you on twitter and some other sucker wants to be part of the Cool Crowd so they get hired again, and do the same fucking thing to the new company.

    It's really a form of mental illness. Stop hiring these people. You don't need some "activist" to help make sure you don't do shit like design killer AIs or to figure out "hey, you know what this AI doesn't work so well with darker skin tones!". It's not rocket surgery, I promise.

    Google is reaping what it has sown, anybody with half a brain would have seen this coming the second some 32 year old in middle management said "Gee, we should hire some really diverse people to help us with our PR issue and reassure people about our AI programs!".

    • by guruevi ( 827432 )

      Read "The Parasitic Mind: How Infectious Ideas Are Killing Common Sense" by Gad Saad it is a very interesting expose on how and why this is happening on both the left and the right.

  • by BAReFO0t ( 6240524 ) on Friday February 19, 2021 @06:21PM (#61081432)

    The fact that it's Google VS SJWs... so literally crazy evil VS evil crazy... makes this even better.

    At least from here in orbit, far away from any of that cancer.
    The only problem is: The popcorn's running out. :)

  • Ethics ? (Score:4, Interesting)

    by AlexHilbertRyan ( 7255798 ) on Friday February 19, 2021 @06:23PM (#61081440)
    Ethics department ?
    From a company that pays no tax in dozens of countries ?
    • Ethics department ? From a company that pays no tax in dozens of countries ?

      This is exactly what I was thinking, but then again, they are firing all of them.

        • Let's be fair here, she was happy to collect a salary but somehow got unlucky and somebody stabbed her in the back. Hardly a shock from a team of two-faced hypocrites.
  • Finding strong AI researchers is crazy difficult. As a result, these employees can make crazy demands, act horribly and get away with it. To be fair, this happens all over the place. For instance, surgeons who bring in the cash for the hospital are the same. Google isn't going to fire someone who fills a critical slot unless things really got out of hand or something was truly amiss.

    • Chances are slim she had more than a superficial grasp of the tech. Otherwise they would have put her in "AI", not "AI Ethics".

      • by ebonum ( 830686 )

        Most of the news I've read has said she was an "AI researcher". When it comes to "ethics", saying the algorithm is improperly biased or unethical is not useful. As a researcher, she should detail the specific logic errors.

  • One where they have no morals and are just plain hypocrites. Maybe start hiring based on merit and fitness for the job?
    • by green1 ( 322787 )

      Merit and fitness for the job? what are you? some kind of sexist, racist, homo/transphobe?
      This is 2021, we hire exclusively based on woke criteria and never let qualifications get in the way of such things!

      No wonder the world is falling apart.

  • I'd prefer a straight fight to all this sneaking around. - Han Solo
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Friday February 19, 2021 @06:44PM (#61081554)

    ... and I also know that Google can be quite hamhanded when managing people, teams and products. But this thing has SJW hipster hysteria written all over it. I figure someone at Google has been all gung ho about hiring diversity tokens and now they have a notable amount of dimwit wussies raising a fuss when Google treats them like adults as a result. Thus we hear noisy Twitter buzz when they screw up epic style or publicly get pissy with their management and get sacked for it.

    • Re: (Score:2, Insightful)

      by AmiMoJo ( 196126 )

      Checking the former head of AI ethics' Wikipedia page, she seems to have a pretty impressive CV. Hardly a diversity hire.

  • Well, I mean, if you are stealing trade secrets no wonder you get fired.

  • by marcle ( 1575627 ) on Friday February 19, 2021 @06:52PM (#61081580)

    After she wrote that nasty racist book, "Gone With the Wind."

  • by msauve ( 701917 ) on Friday February 19, 2021 @07:00PM (#61081602)
    "Google AI head Jeff Dean said in a memo on Friday, obtained by Axios... Google AI head Jeff Dean, wrote in an internal email on Friday."

    I used to think that there was a purpose for English majors. I now admit I was wrong.
  • some human working in HR did - at the behest of some manager(s) somewhere. These are the people who you need to speak to to find out what happened. They will probably hide and try to avoid sticking their head above the parapet lest they experience the same fate as Margaret.

    Until you hear both sides of the story (assuming that they are truthful) it is hard to come to a conclusion.

  • In Canada the premier of Quebec is taking a stand. Quebec is the most socialist part of North America. https://www.cbc.ca/news/canada... [www.cbc.ca]
  • ...then they might have an ethics problem somewhere.

  • by Reservoir Penguin ( 611789 ) on Friday February 19, 2021 @07:41PM (#61081726)
    I'm fully behind Google's SJW purge campaign. As it was said in "The Chronicles of Riddick" - "In normal times, evil should be fought by good, but in times like this, well, it should be fought by another kind of evil".
  • What outcomes of any form has this ethics group ever produced?

    Besides writing bullshit hypocritical papers once a year, what's the point of this division?
  • You hire an expert in the field of "AI ethics", which is basically whining and complaining and causing a scene, and you are surprised when they whine and complain and cause a scene?
  • Gebru was fired for going public with complaints about Google's diversity policies, which is something they clearly didn't want her to do. She also published a paper she was asked to not publish. Normally you do not take internal quarrels public if you want to keep your job. It looks weasely and pathetic for anyone at Google to apologize for what they did to Gebru after the fact.

    As for Mitchell, she seems to be guilty of the same conduct, albeit on a different scale. She's accused of leaking sensitive data.

"To take a significant step forward, you must make a series of finite improvements." -- Donald J. Atwood, General Motors

Working...